Augmented ℓ1 and Nuclear-Norm Models with a Globally Linearly Convergent Algorithm

Authors

  • Ming-Jun Lai
  • Wotao Yin
Abstract

This paper studies the long-standing idea of adding a nice smooth function to “smooth” a nondifferentiable objective function in the context of sparse optimization, in particular the minimization of ‖x‖₁ + (1/(2α))‖x‖₂², where x is a vector, as well as the minimization of ‖X‖∗ + (1/(2α))‖X‖F², where X is a matrix and ‖X‖∗ and ‖X‖F are the nuclear and Frobenius norms of X, respectively. We show that they let sparse vectors and low-rank matrices be efficiently recovered. In particular, they enjoy exact and stable recovery guarantees similar to those known for the minimization of ‖x‖₁ and ‖X‖∗ under conditions on the sensing operator such as its null-space property, restricted isometry property, spherical section property, or “RIPless” property. To recover a (nearly) sparse vector x, minimizing ‖x‖₁ + (1/(2α))‖x‖₂² returns (nearly) the same solution as minimizing ‖x‖₁ whenever α ≥ 10‖x‖∞. The same relation also holds between minimizing ‖X‖∗ + (1/(2α))‖X‖F² and minimizing ‖X‖∗ for recovering a (nearly) low-rank matrix X if α ≥ 10‖X‖₂. Furthermore, we show that the linearized Bregman algorithm, as well as its two fast variants, for minimizing ‖x‖₁ + (1/(2α))‖x‖₂² subject to Ax = b enjoys global linear convergence as long as a nonzero solution exists, and we give an explicit rate of convergence. The convergence property requires neither a sparse solution nor any property of A. To our knowledge, this is the best known global convergence result for first-order sparse optimization algorithms.
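The linearized Bregman algorithm mentioned in the abstract admits a compact sketch. The two-line form below (soft-thresholding step plus dual gradient ascent) and the conservative step size τ = 1/(α‖A‖₂²) are standard choices from the linearized Bregman literature; they are an assumption for illustration, not parameters taken verbatim from this paper:

```python
import numpy as np

def linearized_bregman(A, b, alpha, tau=None, iters=2000):
    """Sketch of the linearized Bregman iteration for
        min  ||x||_1 + (1/(2*alpha)) * ||x||_2^2   s.t.  A x = b.
    Uses the standard two-line form; tau = 1/(alpha*||A||_2^2) is a
    common conservative step size, assumed here for illustration."""
    m, n = A.shape
    if tau is None:
        tau = 1.0 / (alpha * np.linalg.norm(A, 2) ** 2)  # spectral norm
    v = np.zeros(n)  # dual-like accumulator variable
    x = np.zeros(n)
    for _ in range(iters):
        # primal step: componentwise soft-thresholding, scaled by alpha
        x = alpha * np.sign(v) * np.maximum(np.abs(v) - 1.0, 0.0)
        # dual step: gradient ascent on the residual b - A x
        v = v + tau * A.T @ (b - A @ x)
    return x
```

In the limit the iterates satisfy Ax = b while keeping the augmented ℓ1 objective small; per the abstract, convergence is globally linear whenever a nonzero solution exists.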


Related articles

A globally and quadratically convergent primal-dual augmented Lagrangian algorithm for equality constrained optimization

Paul Armand & Riadh Omheni (2015): A globally and quadratically convergent primal–dual augmented Lagrangian algorithm for equality constrained optimization, Optimization Methods and Software, DOI: 10.1080/10556788.2015.1025401 ...

Full text

A Globally Convergent Linearly Constrained Lagrangian Method for Nonlinear Optimization

For optimization problems with nonlinear constraints, linearly constrained Lagrangian (LCL) methods sequentially minimize a Lagrangian function subject to linearized constraints. These methods converge rapidly near a solution but may not be reliable from arbitrary starting points. The well-known implementation MINOS has proven effective on many large problems. Its success motivates us to propose a glo...

Full text

Finding Approximately Rank-One Submatrices with the Nuclear Norm and ℓ1-Norm

We propose a convex optimization formulation with the nuclear norm and ℓ1-norm to find a large approximately rank-one submatrix of a given nonnegative matrix. We develop optimality conditions for the formulation and characterize the properties of the optimal solutions. We establish conditions under which the optimal solution of the convex formulation has a specific sparse structure. Finally, we...

Full text

Improving the Convergence of Non-Interior Point Algorithms for Nonlinear Complementarity Problems

Recently, based upon the Chen-Harker-Kanzow-Smale smoothing function and the trajectory and the neighbourhood techniques, Hotta and Yoshise proposed a non-interior point algorithm for solving the nonlinear complementarity problem. Their algorithm is globally convergent under a relatively mild condition. In this paper, we modify their algorithm and combine it with the superlinear convergence th...

Full text



Journal:
  • SIAM J. Imaging Sciences

Volume 6, Issue 

Pages  -

Publication year: 2013